
Search in the Catalogues and Directories

Hits 1 – 5 of 5

1
Self-Supervised Curriculum Learning for Spelling Error Correction ...
BASE
2
Multi-Head Highly Parallelized LSTM Decoder for Neural Machine Translation ...
BASE
3
Modeling Task-Aware MIMO Cardinality for Efficient Multilingual Neural Machine Translation ...
Abstract: Neural machine translation has achieved great success in bilingual settings, as well as in multilingual settings. As the number of languages grows, multilingual systems tend to underperform their bilingual counterparts. Model capacity has been found crucial for massively multilingual NMT to support language pairs with varying typological characteristics. Previous work increases the modeling capacity by deepening or widening the Transformer. However, modeling cardinality, i.e. aggregating a set of transformations with the same topology, has proven more effective than going deeper or wider when increasing capacity. In this paper, we propose to efficiently increase the capacity for multilingual NMT by increasing the cardinality. Unlike previous work, which feeds the same input to several transformations and merges their outputs into one, we present a Multi-Input-Multi-Output (MIMO) architecture that allows each transformation ... (see the sketch after this entry)
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://www.aclanthology.org/2021.acl-short.46
https://dx.doi.org/10.48448/29pk-ag57
https://underline.io/lecture/25472-modeling-task-aware-mimo-cardinality-for-efficient-multilingual-neural-machine-translation
BASE
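The abstract above contrasts classic cardinality, where one input is fed to several parallel transformations whose outputs are merged into one, with a Multi-Input-Multi-Output (MIMO) layout in which each transformation works on its own input. Since the abstract is truncated, the following is only a minimal, hypothetical PyTorch sketch of that contrast; the class names, the feature-split scheme, and all dimensions are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: contrasts same-input cardinality (every branch
# sees the same input, outputs merged into one) with a hypothetical
# Multi-Input-Multi-Output (MIMO) layout (each branch gets its own input
# slice). Names and dimensions are assumptions, not the paper's code.
import torch
import torch.nn as nn


class SameInputCardinality(nn.Module):
    """Classic cardinality: one input feeds every branch, and the
    branch outputs are merged into a single output."""

    def __init__(self, d_model: int, cardinality: int):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Linear(d_model, d_model) for _ in range(cardinality)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # All branches consume the same x; summation merges their outputs.
        return sum(branch(x) for branch in self.branches)


class MIMOCardinality(nn.Module):
    """Assumed MIMO layout: the input is split along the feature
    dimension so each branch transforms a different slice, and the
    per-branch outputs are kept separate (re-concatenated here)."""

    def __init__(self, d_model: int, cardinality: int):
        super().__init__()
        assert d_model % cardinality == 0
        d_branch = d_model // cardinality
        self.cardinality = cardinality
        self.branches = nn.ModuleList(
            nn.Linear(d_branch, d_branch) for _ in range(cardinality)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Each branch receives a different chunk of the input.
        chunks = x.chunk(self.cardinality, dim=-1)
        outputs = [branch(c) for branch, c in zip(self.branches, chunks)]
        return torch.cat(outputs, dim=-1)


if __name__ == "__main__":
    x = torch.randn(2, 8, 512)  # (batch, sequence, d_model)
    print(SameInputCardinality(512, 4)(x).shape)  # torch.Size([2, 8, 512])
    print(MIMOCardinality(512, 4)(x).shape)       # torch.Size([2, 8, 512])
```

Both modules return the same output shape, but the MIMO variant uses smaller per-branch transformations (4 × 128×128 weights instead of 4 × 512×512), which is one plausible reading of how cardinality can be increased "efficiently" here.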
4
Transformer-based NMT: modeling, training and implementation
Xu, Hongfei. Saarländische Universitäts- und Landesbibliothek, 2021
BASE
5
Probing Word Translations in the Transformer and Trading Decoder for Encoder Layers ...
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 5